# Spanish Masked Language Modeling
## Bertin Base Random

A model based on the RoBERTa-base architecture, trained from scratch on Spanish data and specialized in masked language modeling.

- Organization: bertin-project
- Tags: Large Language Model, Spanish
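Masked language modeling trains a model to recover tokens that have been hidden from the input. A minimal stdlib-only sketch of the RoBERTa-style masking step (the 80/10/10 split inherited from BERT's recipe); the toy vocabulary and the function name `mask_tokens` are illustrative, not taken from either project's code:

```python
import random

MASK = "<mask>"
# Toy substitution vocabulary for the "random token" branch (illustrative only).
VOCAB = ["perro", "casa", "libro", "sol"]

def mask_tokens(tokens, mask_prob=0.15, seed=0):
    """Select ~mask_prob of positions as prediction targets; of those,
    80% become <mask>, 10% become a random token, 10% stay unchanged.
    Returns (corrupted tokens, labels), where labels[i] is the original
    token at a selected position and None elsewhere."""
    rng = random.Random(seed)
    out, labels = list(tokens), [None] * len(tokens)
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            labels[i] = tok
            branch = rng.random()
            if branch < 0.8:
                out[i] = MASK          # 80%: replace with the mask token
            elif branch < 0.9:
                out[i] = rng.choice(VOCAB)  # 10%: replace with a random token
            # remaining 10%: keep the original token
    return out, labels
```

During training, the model only incurs loss at positions where `labels[i]` is not `None`; RoBERTa re-samples this masking on every epoch ("dynamic masking") rather than fixing it once at preprocessing time.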
## Roberta Base Bne

A Spanish masked language model based on the RoBERTa architecture, trained on 570 GB of cleaned text from the Spanish National Library.

- Organization: PlanTL-GOB-ES
- License: Apache-2.0
- Tags: Large Language Model, Transformers, Spanish
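Models like these can be queried through the Hugging Face `fill-mask` pipeline. A minimal sketch, assuming the Hub id `PlanTL-GOB-ES/roberta-base-bne` (verify the exact id on the Hub) and that `transformers` with a backend such as PyTorch is installed; running the demo downloads the model weights:

```python
from transformers import pipeline

# Hub id assumed from the organization and model names on this page.
MODEL_ID = "PlanTL-GOB-ES/roberta-base-bne"

def top_fillers(masked_text: str, k: int = 3) -> list[str]:
    """Return the k most likely substitutions for the <mask> token."""
    fill = pipeline("fill-mask", model=MODEL_ID)
    return [pred["token_str"] for pred in fill(masked_text, top_k=k)]

if __name__ == "__main__":
    # RoBERTa tokenizers use "<mask>" as the mask placeholder.
    print(top_fillers("Madrid es la <mask> de España."))
```

The same call works for the BERTIN model by swapping in its Hub id; only the mask placeholder string must match what the model's tokenizer expects.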